Review for NeurIPS paper: An Analysis of SVD for Deep Rotation Estimation

Neural Information Processing Systems

Summary and Contributions: *** Post-rebuttal update *** I have read the authors' response and considered the modifications they will introduce based on the feedback. Despite having read all the reviews and the rebuttal, I continue to have serious concerns about this paper, and upon further investigation after reading the rebuttal I would like to clarify why I am reducing my score: * Proposition 1 and Corollary 1: These are very poorly phrased and the proofs are quite haphazardly written. Proposition 1 is also quite uninteresting, and it is not clear what its significance is; the 'proof' is trivial (that SVD orthogonalization minimizes the Frobenius norm is a straightforward proof present in any advanced linear algebra textbook). This undercuts the claim that "Our contributions include a theoretically motivated analysis" (lines 44-45).
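The result the reviewer calls standard, that the SVD-derived polar factor is the nearest orthogonal matrix in Frobenius norm, is easy to check numerically. A minimal NumPy sketch (illustrative only, not from the paper or the review):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))

# Project M onto O(3) via SVD: with M = U @ diag(s) @ Vt, the polar
# factor U @ Vt is the closest orthogonal matrix in Frobenius norm.
U, _, Vt = np.linalg.svd(M)
R = U @ Vt

best = np.linalg.norm(M - R, "fro")

# Sanity check against random orthogonal matrices: none should beat R.
for _ in range(1000):
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    assert np.linalg.norm(M - Q, "fro") >= best - 1e-9
```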


An Analysis of SVD for Deep Rotation Estimation


Symmetric orthogonalization via SVD, and closely related procedures, are well-known techniques for projecting matrices onto O(n) or SO(n). These tools have long been used for applications in computer vision, for example optimal 3D alignment problems solved by orthogonal Procrustes, rotation averaging, or Essential matrix decomposition. Despite its utility in these settings, SVD orthogonalization as a procedure for producing rotation matrices is typically overlooked in deep learning models, where preferences tend toward classic representations such as unit quaternions, Euler angles, and axis-angle, or more recently introduced methods. Although 3D rotations are central to computer vision and robotics, a single universally effective representation is still missing. Here, we explore the viability of SVD orthogonalization for 3D rotations in neural networks.
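The projection onto SO(3) described above can be sketched in a few lines of NumPy. This is a hedged illustration of the standard special-orthogonal Procrustes construction, not the paper's own code; the function name `svd_orthogonalize` is my own:

```python
import numpy as np

def svd_orthogonalize(M):
    """Project a 3x3 matrix onto SO(3) via SVD (symmetric orthogonalization).

    With M = U @ diag(s) @ Vt, the nearest orthogonal matrix in Frobenius
    norm is U @ Vt; flipping the sign of the smallest singular direction
    when det(U @ Vt) < 0 yields the nearest proper rotation (det = +1).
    """
    U, _, Vt = np.linalg.svd(M)
    d = np.sign(np.linalg.det(U @ Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

In a deep model, a network head can output an unconstrained 3x3 matrix and pass it through this projection, so the final prediction is always a valid rotation.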